# 21 Linear spaces.

## A first definition.

We have encountered sets of vectors with common features over and over, and it is time to give them a name. We define a set $U$ to be a **linear space** if we can write $U$ as the span of some set, that is, $U = \operatorname{span}(...)$.

**Examples.**

The set $U = \operatorname{span}\{\begin{pmatrix}1\\1 \end{pmatrix}\}$ is a linear space.

The set $K = \operatorname{span}\{\begin{pmatrix}1\\1\end{pmatrix}\} + \begin{pmatrix}1\\0\end{pmatrix}$ is **not** a linear space. It might not be clear why yet, but one cannot write $K$ as the span of any set.

The set $P_{2}(\mathbb{R})=\{\text{all polynomials of degree at most 2 with real coefficients}\}$ is a linear space! This is because we have
$$
P_{2}(\mathbb{R})=\{a+bx+cx^{2}:a,b,c \in \mathbb{R}\} = \operatorname{span} (1,x,x^{2}).
$$

The set of all $2\times 2$ real matrices, denoted $M_{2\times 2}(\mathbb{R})=\{\begin{pmatrix}a & b\\c & d\end{pmatrix}:a,b,c,d \in \mathbb{R}\}$, is a linear space. Indeed, we can write
$$
M_{2\times 2}(\mathbb{R}) = \operatorname{span} \{\begin{pmatrix}1 & 0\\0 & 0\end{pmatrix},\begin{pmatrix}0 & 1\\0 & 0\end{pmatrix},\begin{pmatrix}0 & 0 \\ 1 & 0\end{pmatrix},\begin{pmatrix}0 & 0\\0 & 1\end{pmatrix}\}.
$$

The empty set $\varnothing = \{\}$ is **not** a linear space.

The set $\{\begin{pmatrix}0\\0\end{pmatrix}\}$ is a linear space. Indeed, $\{\begin{pmatrix}0\\0\end{pmatrix}\}=\operatorname{span}\{\begin{pmatrix}0\\0\end{pmatrix}\} = \operatorname{span}\{\}$.

**Remark.** By this definition, a linear space $U$ must have a spanning set. In fact, every linear space $U$ has a basis set. We can produce a basis set for a linear space $U$ by starting with a spanning set and removing the redundant vectors (objects) until we achieve linear independence; what remains is a basis set for $U$.

**Remark.** (You can safely ignore this remark.) The fact that every linear space has a basis set is actually a subtle one.
We won't usually encounter this, but if the spanning set is infinite, it is not clear why we can "just remove the redundant ones". This turns out to be **equivalent to the mathematical Axiom of Choice,** which states: "given any collection of bags, each of which is not empty, we can pick one thing from each bag." So, assuming the Axiom of Choice, we can prove that every linear space has a basis set; and conversely, if every linear space has a basis set, then the Axiom of Choice must be true.

**Remark.** A linear space here is also known as a **vector space** (or **vectorspace**). There are a few more details to it, so I purposely use a different name, but this is a good start -- a linear space is a span of some set. If you use the terminology "linear space", reasonable mathematicians will understand. There is a bit more nuanced structure to vector spaces, but we will unravel that slowly later.

## A characterization of linear space.

As it turns out, we can define linear spaces in a different way, giving us a characterization. This amounts to checking three conditions.

> **A characterization of linear space.**
> A set $U$ is a linear space, that is, we can write $U=\operatorname{span}(...)$, if and only if all three of the following are satisfied:
> (1) **$U$ has a zero element:** There exists an element $0_{U}\in U$ such that for each $x\in U$, we have $x+0_{U}=x$.
> (2) **$U$ is closed under addition:** For each $x,y\in U$, we have $x+y\in U$.
> (3) **$U$ is closed under scaling:** For each $x\in U$ and scalar $c$, we have $cx \in U$.

**Note.** Condition (1) also says: a linear space is **not empty**.

Often, these three conditions make it easier to check whether something is indeed a linear space, or to show that something is not a linear space (as opposed to having to write out a span expression).

**Example.** The set $K = \operatorname{span}\{\begin{pmatrix}1\\1\end{pmatrix}\} + \begin{pmatrix}1\\0\end{pmatrix}$ is **not** a linear space.
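Checks like this can also be done mechanically. As a small sketch (assuming sympy is available; the variable names are my own): points of $K$ have the form $c\begin{pmatrix}1\\1\end{pmatrix}+\begin{pmatrix}1\\0\end{pmatrix}=\begin{pmatrix}c+1\\c\end{pmatrix}$, so condition (1) amounts to asking whether $c+1=0$ and $c=0$ can hold simultaneously.

```python
import sympy as sp

# Points of K are c*(1,1) + (1,0) = (c+1, c).  Condition (1) asks
# whether the zero vector lies in K, i.e. whether the system
# c + 1 = 0 and c = 0 has a solution.
c = sp.symbols('c')
solutions = sp.solve([c + 1, c], c)
print(solutions)  # [] -- the system is inconsistent, so (0,0) is not in K

# By contrast, the unshifted set span{(1,1)} has points (c, c),
# and c = 0 gives the zero vector, so condition (1) holds there.
```

This is only a sanity check; the pencil-and-paper argument below is the actual proof.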
Indeed, we note that the zero vector $\begin{pmatrix}0\\0\end{pmatrix}$ is not in $K$. For suppose, to the contrary, that $\begin{pmatrix}0\\0\end{pmatrix}$ is in $K$. Then we can write
$$
\begin{pmatrix}0\\0\end{pmatrix} = c\begin{pmatrix}1\\1\end{pmatrix}+\begin{pmatrix}1\\0\end{pmatrix}.
$$
But this means $0=c+1$ and $0=c$, a contradiction! Hence $K$ does not have the zero element, whence $K$ is **not** a linear space.

**Example.** Is the set $G = \{\begin{pmatrix}t\\t^{2}\end{pmatrix}:t\in \mathbb{R}\}$ a linear space?

Well, note that the vector $\begin{pmatrix}1\\1\end{pmatrix} \in G$. But the sum $\begin{pmatrix}1\\1\end{pmatrix} +\begin{pmatrix}1\\1\end{pmatrix} =\begin{pmatrix}2\\2\end{pmatrix}$ is **not** in $G$. For suppose, to the contrary, that $\begin{pmatrix}2\\2\end{pmatrix} \in G$. Then $\begin{pmatrix}2\\2\end{pmatrix} = \begin{pmatrix}t\\t^{2}\end{pmatrix}$ for some $t \in \mathbb{R}$. But this means $2=t$ and $2=t^{2}$, a contradiction. So $G$ is not closed under addition. Hence $G$ is **not** a linear space.

Note that we can also show $G$ is not closed under scaling: $2\begin{pmatrix}1\\1\end{pmatrix}=\begin{pmatrix}2 \\2\end{pmatrix}$ is not in $G$, which also shows $G$ is not a linear space. But as long as you can show one condition fails, you can deduce $G$ is not a linear space.

## A comprehensive example.

**Example.** Consider the set $G = \{a+(a+b)x+bx^{2}:a,b\in \mathbb{R}\}$.
(1) Show $G$ is a linear space.
(2) Show $\beta = \{1+x,x+x^{2}\}$ is a basis for $G$.
(3) Is $\gamma = \{1+x+x^{2},x\}$ a basis for $G$?
(4) What is $\dim(G)$?

(1) Note that we can rewrite $G=\{a(1+x)+b(x+x^{2}):a,b\in \mathbb{R}\}=\operatorname{span}(1+x,x+x^{2})$, hence $G$ is a linear space.

(2) To show $\beta=\{1+x,x+x^{2}\}$ is a basis for $G$, we need to show $\beta$ is a spanning set for $G$ and that $\beta$ is linearly independent.

Showing $\operatorname{span}(\beta)=G$. We show mutual containment.
Take $p\in \operatorname{span}(\beta)$. Then $p=c_{1}(1+x)+c_{2}(x+x^{2})$ for some $c_{1},c_{2}$. But note $p=c_{1}+(c_{1}+c_{2})x+c_{2}x^{2}$, which matches the form of elements of $G$. Hence $p \in G$. This shows $\operatorname{span}(\beta)\subset G$.

Take $p \in G$. Then $p=a+(a+b)x+bx^{2}$ for some $a,b \in \mathbb{R}$. But we can write $p=a(1+x)+b(x+x^{2})$, which shows $p\in \operatorname{span}(\beta)$. This shows $G\subset \operatorname{span}(\beta)$.

Hence $\operatorname{span}(\beta)=G$; in other words, $\beta$ is a spanning set of $G$.

Showing $\beta$ is linearly independent. Let us check the homogeneous equation
$$
c_{1}(1+x) + c_{2}(x+x^{2}) = 0.
$$
This says $c_{1}+(c_{1}+c_{2})x+c_{2}x^{2}=0$. Matching coefficients by powers of $x$ gives the equations
$$
\begin{align*}
c_{1} &= 0 \\
c_{1}+c_{2} &= 0 \\
c_{2} &= 0
\end{align*}
$$
which show we must have $c_{1}=c_{2}=0$. Hence $\beta$ is linearly independent.

Together, this shows $\beta$ is a basis set for $G$.

(3) For $\gamma$ to be a basis for $G$, we need first and foremost $\gamma \subset G$. But is the element $1+x+x^{2}$ from $\gamma$ even in $G$? If it is, then we can write
$$
1+x+x^{2} = a + (a+b)x + bx^{2}.
$$
But this gives the system
$$
\begin{align*}
1 &= a \\
1 &= a+b \\
1 &= b
\end{align*}
$$
which gives $1=2$, a contradiction. Hence $1+x+x^{2}$ is not even in $G$! Hence $\gamma$ cannot be a basis for $G$.

(4) Once we know a basis set for the linear space $G$, we know its dimension by counting. Indeed, from (2), $\beta$ is a basis for $G$, and $\beta$ has 2 elements in it. Hence $\dim(G)=2$.
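The computations in (2) and (3) can be double-checked by machine. Here is a small sketch (assuming sympy is available; the helper name `in_G` is my own) that tests membership in $G$ by solving the coefficient-matching system, and confirms the linear independence of $\beta$ via the rank of its coefficient matrix:

```python
import sympy as sp

x, a, b = sp.symbols('x a b')

# Membership test: p is in G exactly when p == a + (a+b)*x + b*x**2
# has a solution in a, b.  We match coefficients by powers of x.
def in_G(p):
    coeff_eqs = sp.Poly(p - (a + (a + b)*x + b*x**2), x).all_coeffs()
    return bool(sp.solve(coeff_eqs, [a, b]))

print(in_G(1 + x))         # True:  a=1, b=0
print(in_G(x + x**2))      # True:  a=0, b=1
print(in_G(1 + x + x**2))  # False: 1=a, 1=a+b, 1=b is inconsistent, as in (3)

# beta = {1+x, x+x^2}: columns are the coefficient vectors with respect
# to 1, x, x^2.  Rank 2 means the columns are linearly independent,
# so beta is a basis and dim(G) = 2.
M = sp.Matrix([[1, 0], [1, 1], [0, 1]])
print(M.rank())            # 2
```

The rank computation replays the homogeneous-equation argument from (2): the system there has only the trivial solution precisely because this matrix has full column rank.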